
NJ Teen Victim Of Deepfake Porn Pleads With Lawmakers: 'It Could Happen To Anyone'

Last fall, Francesca Mani discovered a male classmate had made AI-generated nude images of her without her consent.

L to R: Rep. Tom Kean, Jr., Francesca Mani and Rep. Joe Morelle.

Photo Credit: Congressman Joe Morelle Facebook

While the images were deleted and the Westfield High School student said she never saw them, Francesca still felt sad and helpless. 

Now she has found a way to turn that trauma into something empowering. 

Francesca is working with members of Congress, including Rep. Tom Kean, Jr. (R-Westfield), to pass legislation to stop the spread of deepfake pornography generated by artificial intelligence.

According to a press release issued by New York Congressman Joe Morelle's office, deepfake pornography makes up 96 percent of all deepfakes, and these images almost exclusively target women.

"What happened to me at 14 could happen to anyone," Mani said at a press conference on Tuesday, Jan. 16. "That's why it's so important to have laws in place. We are seriously vulnerable and we need your help."

HR 3106, the Preventing Deepfakes of Intimate Images Act, would prohibit the non-consensual disclosure of digitally altered intimate images and make sharing them a criminal offense.

The Westfield School District was first made aware of the photos back in October and said they were no longer being circulated. Citing confidentiality issues, the district did not say how many students were involved or what disciplinary action was taken. Police were notified, the district said.


